Nanodegree key: nd892
Version: 5.0.0
Locale: en-us
Master the skills to get computers to understand, process, and manipulate human language. Build models on real data, and get hands-on experience with sentiment analysis, machine translation, and more.
Content
Part 01 (FreePreview): Welcome to Natural Language Processing
-
Module 01: Welcome to Natural Language Processing
-
Lesson 01: Welcome to Natural Language Processing
Welcome to a Free Preview of the Natural Language Processing Nanodegree program! Come and take a sneak peek at what our program offers.
-
Part 02 (FreePreview): Building an NLP Pipeline
-
Module 01: Building an NLP Pipeline
-
Lesson 01: Building an NLP Pipeline
Learn about text processing, feature extraction, and part-of-speech tagging.
- Concept 01: NLP and Pipelines
- Concept 02: How NLP Pipelines Work
- Concept 03: Text Processing
- Concept 04: Counting Words
- Concept 05: Feature Extraction
- Concept 06: Modeling
- Concept 07: Quiz: Split Sentences
- Concept 08: Part-of-Speech Tagging
- Concept 09: Named Entity Recognition
- Concept 10: Bag of Words
- Concept 11: TF-IDF
- Concept 12: One-Hot Encoding
- Concept 13: Word Embeddings
- Concept 14: t-SNE
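To make the bag-of-words and TF-IDF ideas above concrete, here is a minimal sketch using scikit-learn (an illustrative assumption; the lesson may use different tooling):

    # Bag of words and TF-IDF feature extraction with scikit-learn.
    from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

    corpus = [
        "the quick brown fox jumps over the lazy dog",
        "the lazy dog sleeps all day",
    ]

    # Bag of words: raw token counts per document.
    bow = CountVectorizer().fit_transform(corpus)

    # TF-IDF: the same counts, reweighted so words common to all documents count less.
    tfidf = TfidfVectorizer().fit_transform(corpus)
    print(tfidf.shape)  # (2, vocabulary size)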
-
Part 03 (FreePreview): Voice User Interfaces
-
Module 01: Voice User Interfaces
-
Lesson 01: Voice User Interfaces
Learn about phonetics, acoustic models, and deep neural networks.
- Concept 01: Welcome to Voice User Interfaces!
- Concept 02: VUI Overview
- Concept 03: VUI Applications
- Concept 04: Conversational AI with Alexa
- Concept 05: Lab: Space Geek
- Concept 06: Challenges in ASR
- Concept 07: Phonetics
- Concept 08: Quiz: Phonetics
- Concept 09: Acoustic Models and the Trouble with Time
- Concept 10: Language Models
- Concept 11: Deep Neural Networks as Speech Models
-
Part 04 (FreePreview): What's Next?
-
Module 01: What's Next?
-
Lesson 01: What's Next?
Here's how to keep building your NLP expertise!
-
Part 05 (FreePreview): Project
-
Module 01: Project
-
Lesson 01: Project: Part of Speech Tagging
In this project, you'll build a hidden Markov model for part of speech tagging with a universal tagset.
-
Part 06 : Introduction to Natural Language Processing
This section provides an overview of the program and introduces the fundamentals of Natural Language Processing through symbolic manipulation, including text cleaning, normalization, and tokenization. You'll then build a part of speech tagger using hidden Markov models.
-
Module 01: Introduction to the Nanodegree
-
Lesson 01: Welcome to Natural Language Processing
Welcome to the Natural Language Processing Nanodegree program!
-
Lesson 02: Knowledge, Community, and Careers
You are starting a challenging but rewarding journey! Take 5 minutes to read how to get help with projects and content.
-
Lesson 03: Get Help with Your Account
What to do if you have questions about your account or general questions about the program.
-
Lesson 04: Intro to NLP
Arpan will give you an overview of how to build a Natural Language Processing pipeline.
- Concept 01: Introducing Arpan
- Concept 02: NLP Overview
- Concept 03: Structured Languages
- Concept 04: Grammar
- Concept 05: Unstructured Text
- Concept 06: Counting Words
- Concept 07: Context Is Everything
- Concept 08: NLP and Pipelines
- Concept 09: How NLP Pipelines Work
- Concept 10: Text Processing
- Concept 11: Feature Extraction
- Concept 12: Modeling
-
Lesson 05: Text Processing
Learn to prepare text obtained from different sources for further processing by cleaning it, normalizing it, and splitting it into individual words or tokens.
- Concept 01: Text Processing
- Concept 02: Coding Exercises
- Concept 03: Introduction to GPU Workspaces
- Concept 04: Workspaces: Best Practices
- Concept 05: Text Processing Coding Examples
- Concept 06: Capturing Text Data
- Concept 07: Cleaning
- Concept 08: Normalization
- Concept 09: Tokenization
- Concept 10: Stop Word Removal
- Concept 11: Part-of-Speech Tagging
- Concept 12: Named Entity Recognition
- Concept 13: Stemming and Lemmatization
- Concept 14: Summary
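As a quick taste of the pipeline this lesson builds, here is a minimal sketch of cleaning, normalization, tokenization, stop word removal, and stemming, assuming NLTK (a common choice; the lesson's exact tools may differ):

    # Basic text processing steps with NLTK.
    import re
    import nltk
    from nltk.corpus import stopwords
    from nltk.stem import PorterStemmer

    nltk.download("punkt", quiet=True)
    nltk.download("stopwords", quiet=True)

    text = "The first time you see The Second Renaissance it may look boring."

    text = text.lower()                          # normalize case
    text = re.sub(r"[^a-z0-9\s]", " ", text)     # clean out punctuation
    tokens = nltk.word_tokenize(text)            # split into tokens
    stops = set(stopwords.words("english"))
    tokens = [t for t in tokens if t not in stops]        # drop stop words
    stems = [PorterStemmer().stem(t) for t in tokens]     # reduce to word stems
    print(stems)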
-
Lesson 06: Spam Classifier with Naive Bayes
In this section, you'll learn how to build a spam e-mail classifier using the naive Bayes algorithm.
- Concept 01: Intro
- Concept 02: Guess the Person
- Concept 03: Known and Inferred
- Concept 04: Guess the Person Now
- Concept 05: Bayes Theorem
- Concept 06: Quiz: False Positives
- Concept 07: Solution: False Positives
- Concept 08: Bayesian Learning 1
- Concept 09: Bayesian Learning 2
- Concept 10: Bayesian Learning 3
- Concept 11: Naive Bayes Algorithm 1
- Concept 12: Naive Bayes Algorithm 2
- Concept 13: Building a Spam Classifier
- Concept 14: Project
- Concept 15: Spam Classifier - Workspace
- Concept 16: Outro
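If you want a preview of where this lesson ends up, here is a minimal naive Bayes spam classifier sketched with scikit-learn (an assumption; the lesson also builds the algorithm up from Bayes theorem by hand):

    # Naive Bayes spam classification over bag-of-words counts.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    messages = ["win a free prize now", "meeting at noon tomorrow",
                "free cash click here", "lunch with the team"]
    labels = [1, 0, 1, 0]  # 1 = spam, 0 = ham

    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(messages)   # bag-of-words counts

    clf = MultinomialNB().fit(X, labels)
    print(clf.predict(vectorizer.transform(["free prize tomorrow"])))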
-
Lesson 07: Part of Speech Tagging with HMMs
Luis will give you an overview of several part-of-speech tagging techniques, including a deeper dive into hidden Markov models.
- Concept 01: Intro
- Concept 02: Part of Speech Tagging
- Concept 03: Lookup Table
- Concept 04: Bigrams
- Concept 05: When bigrams won't work
- Concept 06: Hidden Markov Models
- Concept 07: Quiz: How many paths?
- Concept 08: Solution: How many paths?
- Concept 09: Quiz: How many paths now?
- Concept 10: Quiz: Which path is more likely?
- Concept 11: Solution: Which path is more likely?
- Concept 12: Viterbi Algorithm Idea
- Concept 13: Viterbi Algorithm
- Concept 14: Further Reading
- Concept 15: Outro
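As a preview of the Viterbi algorithm in code, here is a minimal sketch over a toy two-tag HMM (the probabilities are hand-set assumptions; in the project you estimate them from tagged data):

    # Viterbi decoding for a toy HMM part-of-speech tagger.
    tags = ["N", "V"]
    start = {"N": 0.6, "V": 0.4}                       # P(tag starts sentence)
    trans = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.8, "V": 0.2}}
    emit = {"N": {"fish": 0.5, "sleep": 0.5}, "V": {"fish": 0.4, "sleep": 0.6}}

    def viterbi(words):
        # best[t] = (probability, path) of the best tag sequence ending in tag t
        best = {t: (start[t] * emit[t][words[0]], [t]) for t in tags}
        for w in words[1:]:
            best = {t: max(((p * trans[prev][t] * emit[t][w], path + [t])
                            for prev, (p, path) in best.items()),
                           key=lambda x: x[0])
                    for t in tags}
        return max(best.values(), key=lambda x: x[0])

    print(viterbi(["fish", "sleep"]))  # most likely tag path and its probability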
-
Lesson 08: Project: Part of Speech Tagging
In this project, you'll build a hidden Markov model for part of speech tagging with a universal tagset.
-
Lesson 09: (Optional) IBM Watson Bookworm Lab
Learn how to build a simple question-answering agent using IBM Watson.
-
-
Module 02: Career Services
-
Lesson 01: Nanodegree Career Services
The Careers team at Udacity is here to help you move forward in your career - whether it's finding a new job, exploring a new career path, or applying new skills to your current job.
-
Lesson 02: Jobs in NLP
Learn about common jobs in natural language processing, and get tips on how to stay active in the community.
-
Lesson 03: Optimize Your GitHub Profile
Other professionals are collaborating on GitHub and growing their network. Submit your profile to ensure it is on par with leaders in your field.
- Concept 01: Prove Your Skills With GitHub
- Concept 02: Introduction
- Concept 03: GitHub profile important items
- Concept 04: Good GitHub repository
- Concept 05: Interview with Art - Part 1
- Concept 06: Identify fixes for example “bad” profile
- Concept 07: Quick Fixes #1
- Concept 08: Quick Fixes #2
- Concept 09: Writing READMEs with Walter
- Concept 10: Interview with Art - Part 2
- Concept 11: Commit messages best practices
- Concept 12: Reflect on your commit messages
- Concept 13: Participating in open source projects
- Concept 14: Interview with Art - Part 3
- Concept 15: Participating in open source projects 2
- Concept 16: Starring interesting repositories
- Concept 17: Next Steps
-
Part 07 : Computing with Natural Language
-
Module 01: Computing with Natural Language
-
Lesson 01: Feature extraction and embeddings
Transform text using methods like Bag-of-Words, TF-IDF, Word2Vec, and GloVe to extract features that you can use in machine learning models.
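As one concrete example, here is a minimal sketch of training Word2Vec embeddings with gensim (gensim is an assumption; the lesson surveys several featurization methods):

    # Training skip-gram Word2Vec embeddings on a toy corpus with gensim.
    from gensim.models import Word2Vec

    sentences = [["the", "cat", "sat", "on", "the", "mat"],
                 ["the", "dog", "sat", "on", "the", "rug"]]

    model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)
    print(model.wv["cat"].shape)         # a 50-dimensional embedding
    print(model.wv.most_similar("cat"))  # nearest neighbors in embedding space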
-
Lesson 02: Topic Modeling
In this section, you'll learn to split a collection of documents into topics using Latent Dirichlet Allocation (LDA). In the lab, you'll be able to apply this model to a dataset of news articles.
- Concept 01: Intro
- Concept 02: References
- Concept 03: Bag of Words
- Concept 04: Latent Variables
- Concept 05: Matrix Multiplication
- Concept 06: Matrices
- Concept 07: Quiz: Picking Topics
- Concept 08: Solution: Picking Topics
- Concept 09: Beta Distributions
- Concept 10: Dirichlet Distributions
- Concept 11: Latent Dirichlet Allocation
- Concept 12: Sample a Topic
- Concept 13: Sample a Word
- Concept 14: Combining the Models
- Concept 15: Outro
- Concept 16: Notebook: Topic Modeling
- Concept 17: [SOLUTION] Topic Modeling
- Concept 18: Next Steps
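For a sense of what the lab works toward, here is a minimal LDA sketch with scikit-learn (an assumption; the lab may use another library):

    # LDA topic modeling: documents become mixtures of latent topics.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = ["space nasa launch rocket", "election vote senate law",
            "rocket mars orbit launch", "senate bill vote congress"]

    counts = CountVectorizer().fit_transform(docs)   # LDA works on raw counts
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
    print(lda.transform(counts))                     # per-document topic mixtures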
-
Lesson 03: Sentiment Analysis
Learn about using several machine learning classifiers, including Recurrent Neural Networks, to predict the sentiment in text. Apply this to a dataset of movie reviews.
- Concept 01: Intro
- Concept 02: Sentiment Analysis with a Regular Classifier
- Concept 03: Notebook: Sentiment Analysis with a regular classifier
- Concept 04: [SOLUTION]: Sentiment Analysis with a regular classifier
- Concept 05: Sentiment Analysis with RNN
- Concept 06: Notebook: Sentiment Analysis with an RNN
- Concept 07: [SOLUTION]: Sentiment Analysis with an RNN
- Concept 08: Optional Material
- Concept 09: Outro
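Here is a minimal sketch of the RNN approach using Keras and its built-in IMDB movie-review dataset (both assumptions; the lesson's notebooks define their own setup):

    # An LSTM sentiment classifier on IMDB reviews with Keras.
    from tensorflow.keras.datasets import imdb
    from tensorflow.keras.preprocessing.sequence import pad_sequences
    from tensorflow.keras import layers, models

    vocab_size, maxlen = 10000, 200
    (x_train, y_train), _ = imdb.load_data(num_words=vocab_size)
    x_train = pad_sequences(x_train, maxlen=maxlen)  # equal-length sequences

    model = models.Sequential([
        layers.Embedding(vocab_size, 32),       # word indices -> dense vectors
        layers.LSTM(32),                        # read the review as a sequence
        layers.Dense(1, activation="sigmoid"),  # positive vs. negative
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=1, batch_size=128)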
-
Lesson 04: Sequence to Sequence
Here you'll learn about a specific architecture of RNNs for generating one sequence from another sequence. These RNNs are useful for chatbots, machine translation, and more!
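A minimal sketch of the encoder-decoder idea in Keras, with illustrative shapes (assumptions, not the lesson's exact model):

    # Sequence-to-sequence: encode the source, decode conditioned on its state.
    from tensorflow.keras import layers, models

    num_tokens, latent_dim = 64, 128

    # Encoder: consume the source sequence, keep only the final state.
    enc_in = layers.Input(shape=(None, num_tokens))
    _, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(enc_in)

    # Decoder: generate the target sequence, initialized with the encoder state.
    dec_in = layers.Input(shape=(None, num_tokens))
    dec_out = layers.LSTM(latent_dim, return_sequences=True)(
        dec_in, initial_state=[state_h, state_c])
    out = layers.Dense(num_tokens, activation="softmax")(dec_out)

    model = models.Model([enc_in, dec_in], out)
    model.compile(optimizer="adam", loss="categorical_crossentropy")
    model.summary()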
-
Lesson 05: Deep Learning Attention
Attention is one of the most important recent innovations in deep learning. In this section, you'll learn how attention works, and you'll go over a basic implementation of it in the lab.
- Concept 01: Introduction to Attention
- Concept 02: Sequence to Sequence Recap
- Concept 03: Encoding -- Attention Overview
- Concept 04: Decoding -- Attention Overview
- Concept 05: Attention Overview
- Concept 06: Attention Encoder
- Concept 07: Attention Decoder
- Concept 08: Attention Encoder & Decoder
- Concept 09: Bahdanau and Luong Attention
- Concept 10: Multiplicative Attention
- Concept 11: Additive Attention
- Concept 12: Additive and Multiplicative Attention
- Concept 13: Computer Vision Applications
- Concept 14: NLP Application: Google Neural Machine Translation
- Concept 15: Other Attention Methods
- Concept 16: The Transformer and Self-Attention
- Concept 17: Notebook: Attention Basics
- Concept 18: [SOLUTION]: Attention Basics
- Concept 19: Outro
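Here is a minimal NumPy sketch of multiplicative (dot-product) attention, scoring one decoder state against all encoder states (the toy shapes are assumptions):

    # Dot-product attention: score, normalize, take a weighted sum.
    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    encoder_states = np.random.randn(5, 8)   # 5 source positions, hidden size 8
    decoder_state = np.random.randn(8)       # current decoder hidden state

    scores = encoder_states @ decoder_state  # one score per source position
    weights = softmax(scores)                # attention distribution
    context = weights @ encoder_states       # weighted sum of encoder states
    print(weights.round(2), context.shape)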
-
Lesson 06: RNN Keras Lab
This section will prepare you for the Machine Translation project. Here you will get hands-on practice with RNNs in Keras.
-
Lesson 07: Project: Machine Translation
Apply the skills you've learned in Natural Language Processing to the challenging and extremely rewarding task of Machine Translation. Bonne chance!
-
-
Module 02: Career Services
-
Lesson 01: Take 30 Min to Improve your LinkedIn
Find your next job or connect with industry peers on LinkedIn. Ensure your profile attracts relevant leads that will grow your professional network.
- Concept 01: Get Opportunities with LinkedIn
- Concept 02: Use Your Story to Stand Out
- Concept 03: Why Use an Elevator Pitch
- Concept 04: Create Your Elevator Pitch
- Concept 05: Use Your Elevator Pitch on LinkedIn
- Concept 06: Create Your Profile With SEO In Mind
- Concept 07: Profile Essentials
- Concept 08: Work Experiences & Accomplishments
- Concept 09: Build and Strengthen Your Network
- Concept 10: Reaching Out on LinkedIn
- Concept 11: Boost Your Visibility
- Concept 12: Up Next
-
Part 08 : Communicating with Natural Language
-
Module 01: Communicating in Natural Language
-
Lesson 01: Intro to Voice User Interfaces
Get acquainted with the principles and applications of VUI, and get introduced to Alexa skills.
-
Lesson 02: (Optional) Alexa History Skill
Build your own Alexa skill and deploy it!
-
Lesson 03: Speech Recognition
Learn how an automatic speech recognition (ASR) pipeline works.
- Concept 01: Intro
- Concept 02: Challenges in ASR
- Concept 03: Signal Analysis
- Concept 04: References: Signal Analysis
- Concept 05: Quiz: FFT
- Concept 06: Feature Extraction with MFCC
- Concept 07: References: Feature Extraction
- Concept 08: Quiz: MFCC
- Concept 09: Phonetics
- Concept 10: References: Phonetics
- Concept 11: Quiz: Phonetics
- Concept 12: Voice Data Lab Introduction
- Concept 13: Lab: Voice Data
- Concept 14: Acoustic Models and the Trouble with Time
- Concept 15: HMMs in Speech Recognition
- Concept 16: Language Models
- Concept 17: N-Grams
- Concept 18: Quiz: N-Grams
- Concept 19: References: Traditional ASR
- Concept 20: A New Paradigm
- Concept 21: Deep Neural Networks as Speech Models
- Concept 22: Connectionist Temporal Classification (CTC)
- Concept 23: References: Deep Neural Network ASR
- Concept 24: Outro
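As a small taste of the feature-extraction step, here is an MFCC sketch assuming librosa (the lesson's lab may use other tooling):

    # MFCC feature extraction from an audio signal with librosa.
    import librosa

    # Load any WAV file; librosa also ships short example clips.
    y, sr = librosa.load(librosa.example("trumpet"))
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # 13 coefficients per frame
    print(mfcc.shape)  # (13, number of frames)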
-
Lesson 04: Project: DNN Speech Recognizer
Build a deep neural network that functions as part of an end-to-end automatic speech recognition pipeline.
-
Part 09 (Elective): Recurrent Neural Networks
-
Module 01: Recurrent Neural Networks
-
Lesson 01: Recurrent Neural Networks
Ortal will introduce Recurrent Neural Networks (RNNs), which are machine learning models that are able to recognize and act on sequences of inputs.
- Concept 01: Introducing Ortal
- Concept 02: RNN Introduction
- Concept 03: RNN History
- Concept 04: RNN Applications
- Concept 05: Feedforward Neural Network - Reminder
- Concept 06: The Feedforward Process
- Concept 07: Feedforward Quiz
- Concept 08: Backpropagation - Theory
- Concept 09: Backpropagation - Example (part a)
- Concept 10: Backpropagation - Example (part b)
- Concept 11: Backpropagation Quiz
- Concept 12: RNN (part a)
- Concept 13: RNN (part b)
- Concept 14: RNN - Unfolded Model
- Concept 15: Unfolded Model Quiz
- Concept 16: RNN - Example
- Concept 17: Backpropagation Through Time (part a)
- Concept 18: Backpropagation Through Time (part b)
- Concept 19: Backpropagation Through Time (part c)
- Concept 20: BPTT Quiz 1
- Concept 21: BPTT Quiz 2
- Concept 22: BPTT Quiz 3
- Concept 23: Some more math
- Concept 24: RNN Summary
- Concept 25: From RNN to LSTM
- Concept 26: Wrap Up
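To tie the unfolded model to code, here is a minimal NumPy sketch of the RNN forward pass, reusing the same weights at every time step (the toy sizes are assumptions):

    # Unrolled RNN forward pass: the state mixes the input with the old state.
    import numpy as np

    input_size, hidden_size, steps = 4, 3, 5
    Wx = np.random.randn(hidden_size, input_size) * 0.1   # input -> hidden
    Wh = np.random.randn(hidden_size, hidden_size) * 0.1  # hidden -> hidden

    h = np.zeros(hidden_size)                  # initial hidden state
    for t in range(steps):
        x_t = np.random.randn(input_size)      # stand-in for the input at step t
        h = np.tanh(Wx @ x_t + Wh @ h)         # same weights reused at every step
    print(h)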
-
Lesson 02: Long Short-Term Memory Networks (LSTM)
Luis explains Long Short-Term Memory networks (LSTMs) and similar architectures that have the benefit of preserving long-term memory.
- Concept 01: Intro to LSTM
- Concept 02: RNN vs LSTM
- Concept 03: Basics of LSTM
- Concept 04: Architecture of LSTM
- Concept 05: The Learn Gate
- Concept 06: The Forget Gate
- Concept 07: The Remember Gate
- Concept 08: The Use Gate
- Concept 09: Putting it All Together
- Concept 10: Quiz
- Concept 11: Other architectures
- Concept 12: Outro LSTM
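Here is a minimal NumPy sketch of one LSTM step, mapping the lesson's gates to equations (the toy sizes and random weights are assumptions):

    # One LSTM step: forget, learn (input), remember (cell), and use (output) gates.
    import numpy as np

    def sigmoid(x):
        return 1 / (1 + np.exp(-x))

    n = 4                                   # hidden size == input size, for brevity
    W = {g: np.random.randn(n, 2 * n) * 0.1 for g in "fiog"}

    def lstm_step(x, h, c):
        z = np.concatenate([x, h])          # current input and previous hidden state
        f = sigmoid(W["f"] @ z)             # forget gate: what to erase from c
        i = sigmoid(W["i"] @ z)             # learn (input) gate: what to write
        g = np.tanh(W["g"] @ z)             # candidate values to write
        o = sigmoid(W["o"] @ z)             # use (output) gate: what to expose
        c = f * c + i * g                   # remember gate: new cell state
        h = o * np.tanh(c)                  # new hidden state
        return h, c

    h, c = lstm_step(np.random.randn(n), np.zeros(n), np.zeros(n))
    print(h, c)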
-
Lesson 03: Hyperparameters
Learn about a number of different hyperparameters that are used in defining and training deep learning models. We'll discuss starting values and intuitions for tuning each hyperparameter.
- Concept 01: Introducing Jay
- Concept 02: Introduction
- Concept 03: Learning Rate
- Concept 04: Learning Rate
- Concept 05: Minibatch Size
- Concept 06: Number of Training Iterations / Epochs
- Concept 07: Number of Hidden Units / Layers
- Concept 08: RNN Hyperparameters
- Concept 09: RNN Hyperparameters
- Concept 10: Sources & References
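One common way to put these intuitions to work is a random search over hyperparameters; here is a minimal sketch in which train() is a hypothetical stand-in for your training run:

    # Random hyperparameter search over learning rate and minibatch size.
    import random

    def train(lr, batch_size):
        # Hypothetical: return a validation score for this configuration.
        return -abs(lr - 0.003) - abs(batch_size - 64) / 1000

    best = None
    for _ in range(20):
        lr = 10 ** random.uniform(-5, -1)    # sample learning rate on a log scale
        batch_size = random.choice([32, 64, 128, 256])
        score = train(lr, batch_size)
        if best is None or score > best[0]:
            best = (score, lr, batch_size)
    print(best)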
-
Part 10 (Elective): Keras
-
Module 01: Keras
-
Lesson 01: Keras
In this section you'll get a hands-on introduction to Keras. You'll learn to use it to analyze movie reviews.
-
Part 11 (Elective): Sentiment Analysis Extras
-
Module 01: Sentiment Analysis
-
Lesson 01: Sentiment Analysis with Andrew Trask
- Concept 01: Meet Andrew
- Concept 02: Materials
- Concept 03: The Notebooks
- Concept 04: Framing the Problem
- Concept 05: Mini Project 1
- Concept 06: Mini Project 1 Solution
- Concept 07: Transforming Text into Numbers
- Concept 08: Mini Project 2
- Concept 09: Mini Project 2 Solution
- Concept 10: Building a Neural Network
- Concept 11: Mini Project 3
- Concept 12: Mini Project 3 Solution
- Concept 13: Understanding Neural Noise
- Concept 14: Mini Project 4
- Concept 15: Understanding Inefficiencies in our Network
- Concept 16: Mini Project 5
- Concept 17: Mini Project 5 Solution
- Concept 18: Further Noise Reduction
- Concept 19: Mini Project 6
- Concept 20: Mini Project 6 Solution
- Concept 21: Analysis: What's Going on in the Weights?
- Concept 22: Conclusion
-
Part 12 (Elective): TensorFlow
-
Module 01: TensorFlow
-
Lesson 01: TensorFlow
In this section you'll get a hands-on introduction to TensorFlow, Google's deep learning framework, and you'll be able to apply it to an image dataset.
- Concept 01: Intro
- Concept 02: Installing TensorFlow
- Concept 03: Hello, Tensor World!
- Concept 04: Quiz: TensorFlow Input
- Concept 05: Quiz: TensorFlow Math
- Concept 06: Quiz: TensorFlow Linear Function
- Concept 07: Quiz: TensorFlow Softmax
- Concept 08: Quiz: TensorFlow Cross Entropy
- Concept 09: Quiz: Mini-batch
- Concept 10: Epochs
- Concept 11: Pre-Lab: NotMNIST in TensorFlow
- Concept 12: Lab: NotMNIST in TensorFlow
- Concept 13: Two-layer Neural Network
- Concept 14: Quiz: TensorFlow ReLUs
- Concept 15: Deep Neural Network in TensorFlow
- Concept 16: Save and Restore TensorFlow Models
- Concept 17: Finetuning
- Concept 18: Quiz: TensorFlow Dropout
- Concept 19: Outro
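This lesson predates TensorFlow 2.x; if you are on a recent install, the session-based API it teaches is reachable through tf.compat.v1, as in this minimal sketch:

    # "Hello, Tensor World!" with the graph-and-session API via tf.compat.v1.
    import tensorflow as tf

    tf1 = tf.compat.v1
    tf1.disable_eager_execution()           # restore the graph-and-session model

    a = tf1.constant(3.0)
    b = tf1.constant(4.0)
    c = a * b                               # builds a graph node; computes nothing yet

    with tf1.Session() as sess:
        print(sess.run(c))                  # 12.0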
-
Part 13 (Elective): Embeddings and Word2Vec
-
Module 01: Embeddings and Word2Vec
-
Lesson 01: Embeddings and Word2Vec
In this lesson, you'll learn about embeddings in neural networks by implementing the word2vec model.
- Concept 01: Additional NLP Lessons
- Concept 02: Embeddings Intro
- Concept 03: Implementing Word2Vec
- Concept 04: Subsampling Solution
- Concept 05: Making Batches
- Concept 06: Batches Solution
- Concept 07: Building the Network
- Concept 08: Negative Sampling
- Concept 09: Building the Network Solution
- Concept 10: Training Results
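Here is a minimal sketch of the subsampling step from Mikolov et al., where frequent words are dropped with probability 1 - sqrt(t / f(w)) (the threshold t below is inflated for the toy corpus; around 1e-5 is typical on large corpora):

    # Word2vec subsampling: frequent words are randomly discarded.
    import random
    from collections import Counter

    words = "the cat sat on the mat the dog sat on the rug".split()
    t = 0.01
    counts = Counter(words)
    freqs = {w: c / len(words) for w, c in counts.items()}
    p_drop = {w: 1 - (t / f) ** 0.5 for w, f in freqs.items()}

    kept = [w for w in words if random.random() > p_drop[w]]
    print(p_drop["the"], kept)  # "the" is dropped most often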
-
Part 14 (Elective): PyTorch
-
Module 01: Introduction to PyTorch
-
Lesson 01: Introduction to PyTorch
Learn how to use PyTorch to build and train deep neural networks. By the end of this lesson, you will build a network that can classify images of dogs and cats with state-of-the-art performance.
- Concept 01: Welcome!
- Concept 02: Single layer neural networks
- Concept 03: Single layer neural networks solution
- Concept 04: Networks Using Matrix Multiplication
- Concept 05: Multilayer Networks Solution
- Concept 06: Neural Networks in PyTorch
- Concept 07: Neural Networks Solution
- Concept 08: Implementing Softmax Solution
- Concept 09: Network Architectures in PyTorch
- Concept 10: Network Architectures Solution
- Concept 11: Training a Network Solution
- Concept 12: Classifying Fashion-MNIST
- Concept 13: Fashion-MNIST Solution
- Concept 14: Inference and Validation
- Concept 15: Validation Solution
- Concept 16: Dropout Solution
- Concept 17: Saving and Loading Models
- Concept 18: Loading Image Data
- Concept 19: Loading Image Data Solution
- Concept 20: Transfer Learning II
- Concept 21: Transfer Learning Solution
- Concept 22: Tips, Tricks, and Other Notes
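The core define-and-train loop from this lesson, sketched minimally with random stand-in data (an assumption; the lesson trains on real image datasets):

    # Defining and training a classifier in PyTorch.
    import torch
    from torch import nn, optim

    model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
    criterion = nn.CrossEntropyLoss()
    optimizer = optim.Adam(model.parameters(), lr=1e-3)

    x = torch.randn(64, 784)               # a batch of flattened 28x28 "images"
    y = torch.randint(0, 10, (64,))        # random class labels

    for epoch in range(5):
        optimizer.zero_grad()              # clear old gradients
        loss = criterion(model(x), y)      # forward pass and loss
        loss.backward()                    # backpropagate
        optimizer.step()                   # update weights
        print(epoch, loss.item())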
-
Lesson 02: Embeddings & Word2Vec
In this lesson, you'll learn about embeddings in neural networks by implementing the Word2Vec model.
- Concept 01: Word Embeddings
- Concept 02: Embedding Weight Matrix/Lookup Table
- Concept 03: Word2Vec Notebook
- Concept 04: Pre-Notebook: Word2Vec, SkipGram
- Concept 05: Notebook: Word2Vec, SkipGram
- Concept 06: Data & Subsampling
- Concept 07: Subsampling Solution
- Concept 08: Context Word Targets
- Concept 09: Batching Data, Solution
- Concept 10: Word2Vec Model
- Concept 11: Model & Validations
- Concept 12: Negative Sampling
- Concept 13: Pre-Notebook: Negative Sampling
- Concept 14: Notebook: Negative Sampling
- Concept 15: SkipGramNeg, Model Definition
- Concept 16: Complete Model & Custom Loss
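The key building block here is the embedding lookup table; a minimal PyTorch sketch with toy sizes (assumptions):

    # nn.Embedding is the weight matrix / lookup table behind SkipGram.
    import torch
    from torch import nn

    vocab_size, embed_dim = 1000, 50
    embed = nn.Embedding(vocab_size, embed_dim)   # one row per word
    out = nn.Linear(embed_dim, vocab_size)        # predict context words

    word_ids = torch.tensor([4, 217, 4])          # indices into the vocabulary
    vectors = embed(word_ids)                     # (3, 50): rows looked up, not multiplied
    logits = out(vectors)                         # (3, vocab_size) context scores
    print(vectors.shape, logits.shape)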
-
Lesson 03: Implementation of RNN & LSTM
Learn how to represent memory in code. Then define and train RNNs in PyTorch and apply them to tasks that involve sequential data.
- Concept 01: Implementing RNNs
- Concept 02: Time-Series Prediction
- Concept 03: Training & Memory
- Concept 04: Character-wise RNNs
- Concept 05: Sequence Batching
- Concept 06: Pre-Notebook: Character-Level RNN
- Concept 07: Notebook: Character-Level RNN
- Concept 08: Implementing a Char-RNN
- Concept 09: Batching Data, Solution
- Concept 10: Defining the Model
- Concept 11: Char-RNN, Solution
- Concept 12: Making Predictions
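A minimal sketch of a character-level RNN with nn.LSTM (toy sizes are assumptions; the notebook's model adds dropout and stacked layers):

    # A char-RNN: read one-hot characters, score the next character.
    import torch
    from torch import nn

    class CharRNN(nn.Module):
        def __init__(self, n_chars, hidden=64):
            super().__init__()
            self.lstm = nn.LSTM(n_chars, hidden, batch_first=True)
            self.fc = nn.Linear(hidden, n_chars)   # score the next character

        def forward(self, x, state=None):
            out, state = self.lstm(x, state)       # carry state between batches
            return self.fc(out), state

    model = CharRNN(n_chars=80)
    x = torch.zeros(8, 20, 80)                     # batch of 20-step one-hot sequences
    logits, state = model(x)
    print(logits.shape)                            # (8, 20, 80)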
-
Lesson 04: Deploying PyTorch Models
In this lesson, we'll walk through a tutorial showing how to deploy PyTorch models with TorchScript.
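A minimal sketch of the tracing workflow with torch.jit.trace (the tutorial also covers torch.jit.script):

    # Export a model with TorchScript by tracing it with an example input.
    import torch
    from torch import nn

    model = nn.Sequential(nn.Linear(10, 5), nn.ReLU(), nn.Linear(5, 2))
    example = torch.randn(1, 10)

    scripted = torch.jit.trace(model, example)   # record the ops for this input
    scripted.save("model.pt")                    # loadable from C++ or Python
    loaded = torch.jit.load("model.pt")
    print(loaded(example))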
-
Part 15 (Elective): Additional Text Preprocessing
-
Module 01: Python Regular Expressions
-
Module 02: BeautifulSoup
-
Lesson 01: BeautifulSoup Library
- Concept 01: Introduction to BeautifulSoup
- Concept 02: Parsers
- Concept 03: HTML Structure
- Concept 04: Parsing an HTML File
- Concept 05: Navigating The Parse Tree
- Concept 06: Searching The Parse Tree
- Concept 07: Searching by Class and Regexes
- Concept 08: Children Tags
- Concept 09: Exercise: Get Headers and Paragraphs
- Concept 10: The Requests Library
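Pulling the pieces of this lesson together, a minimal sketch with requests and BeautifulSoup (example.com is a placeholder URL; any HTML page works):

    # Fetch a page, parse it, then navigate and search the parse tree.
    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://example.com").text
    soup = BeautifulSoup(html, "html.parser")    # parser choice from the lesson

    print(soup.title.get_text())                 # navigate the parse tree
    for p in soup.find_all("p"):                 # search the parse tree
        print(p.get_text())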
-